implicit data

Automation: indirect data, implied data

Universal English-Russian Dictionary. 2011.

See what "implicit data" means in other dictionaries:

  • Implicit data collection — is used in human-computer interaction to gather data about the user in an implicit, non-invasive way. The collection of user-related data in human-computer interaction is used to adapt the computer interface to the end user. The data… …   Wikipedia

  • Implicit data structure — In computer science, an implicit data structure is a data structure that uses very little memory besides the actual data elements. It is called implicit because most of the structure of the elements is expressed implicitly by their order. Another …   Wikipedia
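The classic instance of the structure described above is a binary heap stored in a flat array: parent/child links are never stored, only implied by index arithmetic. A minimal sketch (the helper name `heap_push` is illustrative, not from the entry):

```python
# Binary min-heap as an implicit data structure: the tree shape is
# expressed entirely by array positions, so no pointer memory is used.
# For the element at index i:
#   parent(i) = (i - 1) // 2
#   left(i)   = 2 * i + 1
#   right(i)  = 2 * i + 2

def heap_push(heap, item):
    """Append item, then sift it up until the heap property holds."""
    heap.append(item)
    i = len(heap) - 1
    while i > 0 and heap[(i - 1) // 2] > heap[i]:
        parent = (i - 1) // 2
        heap[i], heap[parent] = heap[parent], heap[i]
        i = parent

heap = []
for x in [5, 1, 4, 2]:
    heap_push(heap, x)
print(heap[0])  # the minimum is always at index 0
```

Besides the saved pointer memory, the implicit layout keeps elements contiguous, which is why Python's standard `heapq` module uses exactly this array representation.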

  • Implicit Web — The Implicit Web is a concept coined in 2007 to denote web sites which specialize in the synthesis of personal information gleaned from the Internet into a single, coherent picture of user behavior. Implicit data may include clickstream… …   Wikipedia

  • Implicit solvation — (sometimes known as continuum solvation) is a method of representing solvent as a continuous medium instead of individual “explicit” solvent molecules, most often used in molecular dynamics simulations and in other applications of molecular… …   Wikipedia

  • Implicit Certificates — are a variant of public key certificate, such that a public key can be reconstructed from any implicit certificate, and is then said to be implicitly verified, in the sense that the only party who can know the associated private key is the party… …   Wikipedia

  • Implicit memory — is a type of memory in which previous experiences aid in the performance of a task without conscious awareness of these previous experiences.[1] Evidence for implicit memory arises in priming, a process whereby subjects show improved performance… …   Wikipedia

  • Data Intensive Computing — is a class of parallel computing applications which use a data-parallel approach to processing large volumes of data, typically terabytes or petabytes in size, and typically referred to as Big Data. Computing applications which devote most of their …   Wikipedia

  • Data-flow analysis — is a technique for gathering information about the possible set of values calculated at various points in a computer program. A program's control flow graph (CFG) is used to determine those parts of a program to which a particular value assigned… …   Wikipedia
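The CFG-based technique in the entry above is usually computed by iterating transfer equations to a fixpoint. A small sketch of reaching definitions, where the node names, edges, and gen/kill sets are all made up for illustration:

```python
# Data-flow analysis sketch: reaching definitions on a tiny control
# flow graph, computed with the classic iterate-to-fixpoint loop.
# The CFG, definitions d1/d2, and gen/kill sets are hypothetical.

# CFG: entry -> b1 -> b2 -> exit, plus a back edge b2 -> b1.
pred = {"b1": ["entry", "b2"], "b2": ["b1"], "exit": ["b2"]}

gen = {"entry": set(), "b1": {"d1"}, "b2": {"d2"}, "exit": set()}
kill = {"entry": set(), "b1": {"d2"}, "b2": {"d1"}, "exit": set()}

out = {n: set() for n in gen}
changed = True
while changed:  # repeat until no OUT set changes (the fixpoint)
    changed = False
    for n in ["b1", "b2", "exit"]:
        in_n = set().union(*(out[p] for p in pred[n]))
        new_out = gen[n] | (in_n - kill[n])  # transfer function
        if new_out != out[n]:
            out[n], changed = new_out, True

print(out["exit"])  # definitions that can reach the exit node
```

Because each OUT set only grows and the sets are finite, the loop is guaranteed to terminate, which is the standard argument for this family of analyses.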

  • Data compression ratio — Data compression ratio, also known as compression power, is a computer science term used to quantify the reduction in data representation size produced by a data compression algorithm. The data compression ratio is analogous to the physical… …   Wikipedia
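The quantity defined above is plain arithmetic: uncompressed size divided by compressed size, often reported alongside the complementary space savings. The byte counts below are hypothetical, chosen only to make the numbers round:

```python
# Compression ratio = uncompressed size / compressed size.
# The sizes here are made-up example figures.
uncompressed_bytes = 10_000_000
compressed_bytes = 2_500_000

ratio = uncompressed_bytes / compressed_bytes
space_savings = 1 - compressed_bytes / uncompressed_bytes

print(f"ratio = {ratio:.1f}:1")          # ratio = 4.0:1
print(f"savings = {space_savings:.0%}")  # savings = 75%
```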

  • Data parallelism — (also known as loop-level parallelism) is a form of parallelization of computing across multiple processors in parallel computing environments. Data parallelism focuses on distributing the data across different parallel computing nodes. It… …   Wikipedia
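The distribution described above can be sketched in a few lines: the same operation runs on every chunk of the data, and the partial results are combined. Here a thread pool stands in for the parallel computing nodes of the entry (an illustrative sketch, not from the entry itself):

```python
# Data parallelism: identical work applied to disjoint chunks of the
# data, with threads standing in for parallel nodes.
from concurrent.futures import ThreadPoolExecutor

def chunk_sum(chunk):
    return sum(x * x for x in chunk)  # the same operation on every chunk

data = list(range(1000))
chunks = [data[i::4] for i in range(4)]  # distribute data across 4 workers

with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(chunk_sum, chunks))

total = sum(partials)
print(total)  # equals the serial sum of squares over the whole data
```

The combine step works here because summation is associative and commutative; reductions without that property need more care about chunk order.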

  • Data model — A data model provides the details of information to be stored, and is of primary use when the final product is the generation of computer software code for an application or the preparation of a functional… …   Wikipedia
